AUTOMATON FOR TREATING A SURFACE
Patent abstract:
The present invention relates to an automaton (1) for treating a surface to be treated, comprising a processing means (10), for example an arm, comprising a movable end (12) configured to treat a surface, and an interface configured to indicate to the automaton the surface to be treated. The interface includes a screen (21) configured to display a representation of at least a portion of the room in which the surface to be treated is located, and the interface is configured to allow a person to select the surface to be treated on the representation displayed on the screen (21).
Publication number: FR3050672A1
Application number: FR1653918
Filing date: 2016-04-29
Publication date: 2017-11-03
Inventor: Antoine Rennuit
Applicant: Antoine Rennuit
Patent description:
Background of the invention

The present invention relates to an automaton for treating a surface, for example by painting. More particularly, the present invention relates to a control interface for such an automaton.

Robots for treating a surface, for example for painting or sanding, already exist. Such robots are used in the automotive field, for painting vehicle bodies. These robots are autonomous and are programmed to perform a series of operations without the supervision of an operator: they thus replace one or more people by performing the tasks previously programmed into the robot's control computer. However, these painter robots, like other robots used on assembly lines, are fixed to the ground, and only the arm holding the tool is mobile: their field of action is geographically limited. Moreover, in order to avoid accidents with people who may be near the robots, they are generally surrounded by a safety zone in which the robot works and which individuals must not enter.

Robots also exist in the building industry. Such robots are used in particular to work in places that are difficult, or even forbidden, for individuals to access. This is the case, for example, when the surface to be treated includes high ceilings, or is located in an area with radiation risks. An example of a robot that can intervene under such conditions is described in particular in application FR 2 902 038. It can be seen that the robot is directly controlled by an operator, to treat the surface in his place and under his control. Thus, such robots are not autonomous, and need to be controlled continuously by an operator: the use of such a robot makes the operator's task easier, but still requires his presence, whether beside the robot or at a distance from it. The robot therefore does not replace the operator in performing the operation, but rather acts as a tool that remains under the operator's control and does not act alone. Thus, the robot does not make it possible to perform work in place of an individual, but still requires supervision. In particular, the robot can treat a given surface only with the instructions given by the operator during the processing of said surface, in order to adapt the actions performed by the robot in real time.

Object and summary of the invention

The present invention aims to solve the various technical problems mentioned above. In particular, the present invention aims at providing means for indicating to such an automaton the work to be performed autonomously. More particularly, the present invention aims at providing an interface that facilitates the entry of instructions by an operator.

Thus, according to one aspect of the invention, there is provided an automaton for treating a surface to be treated, comprising a processing means, for example an arm, comprising a movable end configured to treat a surface, and an interface configured to indicate to the automaton the surface to be treated. In particular, the interface comprises a screen configured to display a representation of at least a part of the room in which the surface to be treated is located, and the interface is configured to allow a person to select or draw the surface to be treated on the representation displayed on the screen.

Thanks to this interface, it becomes easier for the operator to indicate to the automaton the surfaces to be treated autonomously. In particular, the interface is configured to be intuitive and easy to use.
It not only facilitates the entry of instructions, but also their verification before the automaton starts working.

Preferentially, the interface is also configured to display, for example graphically or textually, information relating to the operation of the automaton, for example its operating parameters or anomalies. The interface then allows a dialogue between the automaton, which is intended to operate autonomously, and the operator responsible for monitoring its operation. By indicating the operating parameters and any anomalies, one makes sure that the automaton works as desired and can anticipate future problems (a paint refill, for example).

According to a first embodiment, the interface is configured to identify, on the part of the room displayed by the screen, at least one determined surface surrounded by a closed contour. According to this first embodiment, the interface is capable of identifying the different surfaces represented on its screen. Such identification then makes it possible for the operator to choose the surfaces to be treated by selecting the surfaces identified by the interface.

Preferably, the interface is configured to allow a person: to select the surface to be treated, by selecting, on the representation displayed on the screen, at least one determined surface surrounded by a closed contour, and possibly to exclude at least one portion of said determined surface from treatment, by selecting, on the representation displayed on the screen, a determined portion surrounded by a closed contour located within said determined surface.

As indicated previously, since the different surfaces of the zone to be treated are identified by the interface, it is then enough for the operator to select those to be treated, and possibly those not to be treated, for the interface to have its working instructions. In addition, the interface can also indicate on the screen the surfaces selected or excluded by the operator, in order to facilitate the verification of the instructions given to the automaton.
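As an illustration of the select-then-exclude logic just described, here is a minimal sketch in Python. The surface identifiers mirror the reference numerals used later in Figures 2 to 4 (wall 22, switch 30, window 32), but the data structure itself is an assumption made for the example, not something specified in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class SurfaceSelection:
    """Surfaces picked on the screen, minus inner portions excluded afterwards."""
    included: set = field(default_factory=set)
    excluded: set = field(default_factory=set)

    def select(self, surface_id: str) -> None:
        """Select a determined surface (a closed contour identified by the interface)."""
        self.included.add(surface_id)
        self.excluded.discard(surface_id)

    def exclude(self, surface_id: str) -> None:
        """Re-selecting an inner surface marks it as not to be treated."""
        self.excluded.add(surface_id)

    def to_treat(self) -> set:
        return self.included - self.excluded

# Mirrors Figures 2 to 4: selecting the back wall also selects the switch and
# the window it contains; both are then excluded from the treatment.
selection = SurfaceSelection()
for sid in ("wall_22", "switch_30", "window_32"):
    selection.select(sid)
selection.exclude("switch_30")
selection.exclude("window_32")
print(selection.to_treat())   # {'wall_22'}
```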
According to another embodiment, the interface is configured to allow a person: to select the surface to be treated, by drawing, on the representation displayed on the screen, at least one closed contour defining a delimited surface, and possibly to exclude at least one portion of said delimited surface from treatment, by drawing, on the representation displayed on the screen, a closed contour defining a delimited portion located within said delimited surface. In this embodiment, the choice of surfaces to be treated is made by the operator, who draws the contour of said surface to be treated. However, the accuracy of the selection of the surfaces to be treated may be limited by the drawing of the contour, and may thus require additional time compared to the first embodiment, to ensure that the limits indicated to the interface correspond to the desired limits.

Preferably, the interface is configured to identify, on the part of the room displayed by the screen, at least one singularity, for example one or more corners or one or more edges. The interface is also configured to modify the closed contour of at least one delimited surface to match one or more of said singularities. In this case, the interface adapts the contour drawn by the operator so as to match it with elements that the interface can identify. In particular, in the case of a discrete geometrical model of the room, that is to say a model in the form of a set of points, the interface can then match certain portions of the contour drawn by the operator with one or more singularities of the room determined from the discretized model.

Preferably, the interface screen is touch-sensitive and/or the interface comprises a manual pointing device, separate from the screen, such as a mouse or a touchpad. These are different pointing means that can be used to select on the screen the contour or surface desired by the operator.

Preferably, the interface is also configured to allow a person to associate one or more treatment parameters with the selected surface to be treated. It is thus possible to associate different parameters with different surfaces identified separately. It is then possible to give the automaton its instructions once, and let it perform the various tasks requested on the different chosen surfaces, even if the processing parameters are not the same.

Preferably, the surface is selected by combining several sub-selections. More precisely, the work that can be done by the automaton is not limited to a single continuous surface, but may comprise different surfaces identified separately and distinct from each other. Alternatively, the surface to be treated may be continuous, but may be selected by addition and/or subtraction of several sub-selections.

According to a first embodiment, the screen is configured to display the representation of at least a part of the room in which the surface to be treated is located, from a 3D modeling file of said room. In this first embodiment, the interface uses a 3D model of the room in which the automaton must work: it is possible in particular for the interface to identify the different surfaces of said room through their modeling in the 3D file.

According to another embodiment, the screen is configured to display the representation of at least a part of the room in which the surface to be treated is located, from data obtained in the room, for example received from a means of scanning said room such as a scanner or a 3D camera. In this case, the interface measures or records, directly in the room, the information necessary to identify a surface to be treated. The interface can scan the room, or use a 3D camera or a camera to represent a portion of the room on its screen. The interface can also implement image processing or exploit information entered by the operator (geometry, delimitation, ...) to determine the different parts of the room. The interface can, for example, determine the junctions and other singularities between surfaces as a function of their brightness, of geometric elements provided by the operator, or of other information. Such an embodiment has the advantage of being directly operational in situ, and does not require prior modeling of the room.

Preferably, the automaton also comprises: - a base configured to move on the ground, and - a platform mounted on the base and configured to move, at least in part, perpendicular to the base, for example vertically, the processing means being mounted on the platform. Such an automaton makes it possible to treat an extended surface by horizontal displacement thanks to its base, and by vertical displacement thanks to its platform. An automaton is thus obtained that can work autonomously, without requiring the continuous supervision of a person.
The operator's intervention time on the automaton is therefore reduced, and the operator can in turn treat part of the surface at the same time as the automaton, for example by carrying out the work requiring particular technical skill or know-how, such as the treatment of specific surfaces like the outline of wall outlets, the back of pipes, stairs, and so on.

Preferably, the platform and the base remain stationary when the processing means is moved, and/or the processing means remains stationary when the platform or the base is moved. The movements of the automaton are deliberately programmed to be carried out separately from one another: either the platform and/or the base move, or the processing means moves, but not both at the same time. It then becomes easier for the people who may work around the automaton to predict and anticipate its movements, in order to act accordingly.

Preferably, the treatment of a surface to be treated is a painting of said surface to be treated, a sanding of said surface to be treated and/or a projection of coating onto said surface to be treated.

Preferably, the automaton also comprises one or more sensors for locating it in space and with respect to the surface to be treated, for example ultrasonic sensors, laser sensors, time-of-flight sensors, video systems or even sensors with beacons delimiting at least a portion of the surface to be treated. The purpose of the sensors is to assist the positioning of the automaton in its environment, in order to facilitate the identification of the surfaces to be treated, as well as the delimitation of said surfaces in space.

Preferably, the automaton also comprises presence sensors and is configured to limit or even avoid contacts with potential obstacles, for example with people. Such sensors are intended in particular to preserve the physical integrity of people likely to work or be near the automaton. Thus, thanks to the various sensors, the automaton is configured to detect the presence of such people, and to act accordingly so as not to hurt or disturb them. The automaton then becomes collaborative, since it can come to support people in their work, by doing its own work nearby. The term "collaborative" means an automaton able to work close to individuals, without barriers.

Brief description of the drawings

The invention and its advantages will be better understood on reading the detailed description of a particular embodiment, taken by way of non-limiting example and illustrated by the appended drawings, in which: - Figure 1 is a schematic perspective view of an automaton according to the present invention, - Figures 2 to 4 show a first mode of selection of a surface via the interface of the automaton, according to the present invention, and - Figures 5 and 6 show a second mode of selection of a surface via the interface of the automaton, according to the present invention.

Detailed description of the invention

Figure 1 schematically illustrates a perspective view of an automaton for treating a surface to be treated, according to the present invention. The automaton 1 comprises a base 2 for moving the automaton on the ground, comprising displacement means, in this case wheels 4, and a wheel drive means, for example a motor (not shown). The base 2 constitutes a frame for the automaton 1, and supports all the elements of the automaton 1 described below. The automaton 1 also comprises a platform 6. The platform 6 is mounted on the base 2, for example via two rails 8 mounted substantially perpendicular to the base 2.
The platform 6 is configured to move along the rails 8, that is to say substantially perpendicular to the base 2, by means of a drive means (not shown), for example a motor. A platform 6 is thus obtained which can move, for example vertically, to reach different heights of the surface to be treated.

The automaton 1 also comprises a processing means, in this case an arm 10. The arm 10 is mounted on the platform 6 and comprises, on the one hand, a processing end 12 at which the surface treatment tool 14 is mounted, in this case a paint spray nozzle 14, and, on the other hand, one or more joints 16 connecting one or more arm portions 18. The joints 16 can move and orient the treatment tool 14 as desired, over an entire zone of given area. The area depends in particular on the length of the arm portions 18 and the amplitude of the joints 16. Moreover, the arm 10 also makes it possible to move the treatment tool 14 parallel to the surface to be treated, in order to obtain a uniform treatment result.

Finally, the automaton 1 comprises a control unit 20. The control unit 20 can be mounted on the base 2 of the automaton 1, or be located remotely, or be partly mounted on the base 2 and partly remote. The control unit 20 controls the various means of the automaton 1, including the drive means of the base 2, the platform 6 and the arm 10. The commands are determined in particular by the control unit 20 according to the instructions and data communicated to it. More specifically, the electronic control unit 20 is configured to plan the treatment of the surface to be treated, taking into account the structure of the automaton 1 and facilitating the work of individuals near the automaton 1.

Thus, the control unit 20 may for example be configured to, first, divide the surface to be treated into subdivisions of area less than or equal to the given area. In other words, the surface to be treated is subdivided into portions that can each be treated by moving the arm alone, the platform 6 and the base 2 remaining stationary. Then, the control unit is configured to treat the surface of each subdivision by controlling the movement of the arm 10. When the subdivision has been processed, the electronic control unit 20 then commands a change of subdivision by moving the platform 6 vertically and/or by moving the base 2 on the ground. In such a case, the automaton 1 works by subdivision, or cell, each subdivision corresponding to the surface that can be processed by the movement of the arm 10 of the automaton 1 alone. The automaton 1 then moves from subdivision to subdivision, by moving the platform 6 and/or the base 2.

The subdivisions can be obtained by cutting the surface to be treated according to a regular grid whose lines correspond to the movements of the platform 6 and the base 2, in this case vertical and horizontal lines. Once the surface to be treated has been cut up by the control unit 20, it can then control the automaton 1 so as to treat said different subdivisions successively. Preferably, all the subdivisions corresponding to the same position of the base 2, that is to say requiring only the displacement of the arm 10 and the platform 6, are treated successively. Then, the base 2 is moved to another position to successively treat all the corresponding subdivisions, and so on. In this way, the movements of the automaton 1 on the ground are limited, which favors the work of the automaton 1 with individuals nearby.
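By way of illustration of the subdivision and ordering logic described above, here is a minimal sketch in Python. The function name, the grid indexing and the numeric values are assumptions made for the example, not values taken from the patent.

```python
import math
from itertools import product

def plan_subdivisions(wall_width: float, wall_height: float,
                      reach_width: float, reach_height: float):
    """Cut the surface into a regular grid of cells no larger than the arm's
    reach, ordered so that every cell sharing one base position is treated
    before the base moves again (ground movements are kept to a minimum)."""
    n_base = math.ceil(wall_width / reach_width)        # horizontal base positions
    n_platform = math.ceil(wall_height / reach_height)  # vertical platform positions
    # product() yields all platform positions for base 0, then base 1, and so on.
    return [(b, p) for b, p in product(range(n_base), range(n_platform))]

# Illustrative values: a 6 m x 2.5 m wall and a 1.2 m x 1.0 m arm reach.
for base_index, platform_index in plan_subdivisions(6.0, 2.5, 1.2, 1.0):
    print(f"base position {base_index}, platform position {platform_index}")
```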
Within each subdivision, the processing applied by the automaton 1 can also be planned, in particular in order to give a rendering close to that provided by a professional. For example, in the case of a painting treatment, the control unit 20 can be configured to first control the treatment of an edge or contour of the surface to be treated: such a treatment only applies when the subdivision in question is positioned at the edge of the surface to be treated, and is irrelevant if the subdivision in question is entirely surrounded by other subdivisions. Such a method corresponds to "rechampi" (cutting in), a technique of first working the contours of the surface before working the center. Once the contour is done, the control unit 20 can then control the arm 10 so as to process the remainder of the subdivision area, i.e. the interior of the subdivision. For this work, the control unit 20 can in particular provide for a displacement of the arm 10 in a horizontal or vertical raster, that is to say treat the interior of the subdivision along certain contour lines of said subdivision (horizontal or vertical contours). Likewise, when the subdivision contains a particular element, such as a switch or an electrical outlet, the same technique can be used: the control unit 20 can be configured to perform the treatment along the contour of the particular element, before performing the treatment between the particular element and the contour of the subdivision. When all the subdivisions have been processed, the automaton 1 can then stop.

It should be noted that in the example described above, the surface to be treated is a single surface. However, the automaton 1 according to the invention is not limited to such surfaces, but can also treat a surface to be treated made up of several parts that are separate and distinct from one another. In this case, each part of the surface to be treated is worked as indicated above, that is to say it is itself divided into subdivisions that are worked successively. When a part is finished, the control unit 20 controls the base 2 and/or the platform 6 so as to move to another untreated part of the surface to be treated. Such a displacement can in particular operate, in practice, on the one hand by attributing to each distinct part of the surface to be treated a specific work reference frame used by the automaton 1 for the treatment of said distinct part of the surface, and on the other hand by positioning the different specific work reference frames relative to one another in a single global reference frame, in order to allow the automaton 1 to move from one distinct part of the surface to be treated to another. For example, the different surface portions to be treated may be two walls of a room, for example two contiguous walls forming an angle between them, or two distant parallel walls facing each other. In both cases, the automaton 1 has to reorient itself with respect to the portion of the surface to be treated as it passes from one surface portion to another, before starting the treatment of said surface portion.
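Returning to the per-subdivision planning described above (contour first, then the interior), here is a minimal sketch in Python of one possible tool path for a rectangular subdivision. The function name, the pass width and the coordinates are illustrative assumptions, not values from the patent.

```python
def subdivision_path(width: float, height: float, pass_width: float):
    """Tool path for one rectangular subdivision (local coordinates, meters):
    the closed contour first, then horizontal passes over the interior."""
    # 1. Contour pass ("cutting in" before filling the center).
    contour = [(0.0, 0.0), (width, 0.0), (width, height), (0.0, height), (0.0, 0.0)]
    # 2. Raster fill of the interior, alternating direction on each pass.
    fill = []
    y = pass_width / 2.0
    left_to_right = True
    while y < height:
        xs = (0.0, width) if left_to_right else (width, 0.0)
        fill += [(xs[0], y), (xs[1], y)]
        left_to_right = not left_to_right
        y += pass_width
    return contour + fill

# Illustrative call: a 1.2 m x 1.0 m cell painted with 0.25 m wide passes.
for point in subdivision_path(1.2, 1.0, 0.25):
    print(point)
```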
In order to select the surfaces to be processed by the automaton, its interface comprises a screen 21 configured to display a portion of the room in which the automaton must work. Such a display is illustrated in particular in Figures 2 to 6. Figures 2 to 4 illustrate a first mode of selecting a surface via the interface.

In this first selection mode, it is considered that the interface is configured to identify the different surfaces displayed on its screen 21. Thus, Figure 2 illustrates a room portion as displayed by the screen 21 of the interface. The room portion may comprise for example three walls 22, a floor 24, a ceiling 26, as well as a switch 30 and a delimited wall portion 32, in this case a window for example. Each of these elements is delimited by a closed contour known to the interface, and thus constitutes a determined surface for the interface. The different determined surfaces are thus displayed on the screen 21 of the interface, and can then be selected by the operator, for example with his finger if the screen 21 is touch-sensitive, or with the aid of a pointing device.

Figure 3 illustrates the display of the screen 21 after selection of a determined surface, in particular that of the back wall 22. Since the closed contour of the back wall 22 also encloses the switch 30 and the wall portion 32, the three determined surfaces are considered selected by the interface. Nevertheless, it is also possible to exclude from treatment the determined surface corresponding to the switch 30 and that corresponding to the wall portion 32, by re-selecting them specifically once they have been selected. These two determined surfaces 30, 32 are then excluded from the surface to be treated by the automaton (see Figure 4). It is thus possible to indicate quickly and easily to the interface the surface or surfaces to be processed by the automaton 1.

Figures 5 and 6 illustrate a second mode of selecting a surface via the interface. In this second selection mode, it is considered that the interface is not configured to identify the different surfaces displayed on its screen 21. In this second selection mode, the room portion displayed by the screen 21 of the interface may be the result of a scan of the room by a scanner or a 3D camera. Alternatively, the portion of the room displayed may be a representation of the room obtained from a photo or drawn directly by the operator, supplemented by information entered by the operator such as the dimensions or the geometry of the different elements displayed on the screen.

Figure 5 illustrates a room portion as displayed by the screen 21 of the interface. The room portion may comprise in particular a wall 34, a portion 36 of which is delimited by a contour 38 visible on the wall 34. The part of the room displayed on the screen 21 may in particular be obtained by a scanner or by a photograph of the room. In order to select a surface to be treated, and in particular the surface 36 delimited by the contour 38, the operator then draws on the screen a closed contour 40 defining a delimited surface 42. The contour 40 being drawn by hand by the operator, it does not exactly match the contour 38 of the surface 36 to be treated. The interface can then modify the contour 40 of the delimited surface 42 so as to match it with singularities displayed on the screen 21, in this case with the contour 38. The selected surface is then modified so as to correspond to the surface 36, which is the one to be processed by the automaton 1 (Figure 6). It is thus possible to indicate to the interface the surface or surfaces to be processed by the automaton 1, without requiring prior modeling of the room.
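A minimal sketch in Python of the contour-snapping step just described: each hand-drawn point is pulled onto the nearest identified singularity when one lies close enough. The tolerance, the coordinates and the function name are assumptions made for the example, not values from the patent.

```python
import math

def snap_contour(drawn, singularities, tolerance=0.05):
    """Replace each hand-drawn contour point by the nearest identified
    singularity (corner/edge point) when it lies within `tolerance` meters;
    points with no nearby singularity are kept as drawn."""
    snapped = []
    for px, py in drawn:
        best, best_d = None, tolerance
        for sx, sy in singularities:
            d = math.hypot(px - sx, py - sy)
            if d <= best_d:
                best, best_d = (sx, sy), d
        snapped.append(best if best is not None else (px, py))
    return snapped

# The operator's rough contour 40 is pulled onto the window contour 38
# (corner coordinates are illustrative).
window_corners = [(1.00, 1.00), (1.60, 1.00), (1.60, 1.80), (1.00, 1.80)]
rough = [(1.03, 0.98), (1.57, 1.02), (1.62, 1.78), (0.99, 1.83)]
print(snap_contour(rough, window_corners))
```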
Whatever the mode of selection of the surface to be treated, the control unit 20 can also allow an operator to specify the tasks to be performed and their parameters, as well as to be informed of the various status messages or alerts detected by the control unit 20. Thus, the control unit 20 can allow the operator to specify: the processing parameters, for example for sanding (speed, effort, ...) or for painting (number of layers to apply, type of paint, quantity of paint, pattern, interlacing of layers, overlap of two contiguous passes, ...); and the different areas of the surface to be treated, especially when the treatment parameters must not be uniform over the entire surface to be treated, but must change according to determined data.

In order to allow the automaton 1 to locate itself and move in space to treat the different surfaces, it may include sensors. The sensors can be of different technologies, depending on the amplitude and/or the precision of the distances concerned. Thus, the automaton 1 can comprise two distance sensors, for example ultrasonic sensors, mounted in the treatment plane of the arm 10 and making it possible to determine, on the one hand, the distance between the surface to be treated and the automaton 1, and, on the other hand, the angle between the axis of the automaton 1 and the surface to be treated. These sensors thus ensure that the arm 10 performs the treatment at the right distance from the surface to be treated, while moving parallel to it. Alternatively, when the treatment requires contact with the surface to be treated, for example sanding, the determination of the distance to the surface to be treated, and optionally of the angle between the axis of the automaton 1 and the surface to be treated, can be evaluated directly by the treatment tool, from internal force sensors used to control the force applied to the surface to be treated.

The automaton 1 may also include time-of-flight sensors, for example laser sensors, to monitor the position of the automaton 1 in its environment. For this purpose, beacons can also be positioned at different locations identified by the automaton 1, to ensure that it is indeed in front of the surface portion to be treated. Such sensors also make it possible to ensure that the movements of the base 2 are parallel to the surface to be treated, so that the junctions between the different subdivisions coincide. Alternatively, it is also possible to provide, in addition to these sensors or as a replacement for them, one or more cameras allowing the automaton 1 to position itself in its environment in three dimensions. Thus, two stereoscopically positioned cameras can allow the control unit 20 to locate itself in space by determining the distance and the angle separating the automaton 1 from the surfaces to be treated or from the surfaces delimiting its moving environment. This may also allow the automaton 1 to move from one part of the surface to be treated to another, when these are distinct and separated from one another, as described above. In all cases, a preliminary step of calibrating the initial position of the automaton 1 in its environment may be necessary for the implementation of the locating and positioning steps during the treatment of the surface to be treated.
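By way of illustration of the pair of distance sensors mentioned above, here is a minimal sketch in Python of how a stand-off distance and a yaw angle could be derived from two range readings taken a known distance apart. It assumes a flat surface and sensors measuring along parallel directions; the numeric values and the function name are illustrative, not taken from the patent.

```python
import math

def wall_pose_from_sensors(d1: float, d2: float, baseline: float):
    """Estimate stand-off distance and yaw angle from two range sensors
    mounted a known `baseline` apart in the treatment plane.
    Readings and baseline in meters; angle returned in degrees."""
    distance = (d1 + d2) / 2.0                            # stand-off at the mid-point
    angle = math.degrees(math.atan2(d2 - d1, baseline))   # 0 deg = parallel to the wall
    return distance, angle

# Illustrative readings: one sensor sees the wall 2 cm further away than the
# other, so the automaton is slightly rotated with respect to the surface.
print(wall_pose_from_sensors(0.30, 0.32, 0.40))   # roughly 0.31 m and 2.9 degrees
```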
Finally, the automaton 1 may also include presence sensors, to ensure that it can work close to individuals without hitting or injuring them. For example, the automaton 1 may comprise optical sensors forming a barrier between the working zone of the automaton 1, and more particularly the working zone of the platform 6 and the arm 10, and the rest of the room. Thus, in the event of detection of an intrusion of an object into said working zone, the control of the platform 6 and the arm 10 can be interrupted, in order to ensure that no one is injured and the automaton 1 is not damaged. Furthermore, or alternatively, the control unit 20 can monitor the control of the various means of movement of the automaton 1, for example those of the base 2 or the platform 6, in order to detect a possible obstruction to a displacement command. In this case, the command can be stopped, or even reversed, and the automaton 1 can wait until a person comes to check the reasons for the obstruction. It is then ensured that the automaton 1 can indeed operate among individuals without the risk of hurting them.

Thus, thanks to the invention, it becomes possible to treat a surface to be treated with the aid of an automaton, while allowing individuals to work in the vicinity of the automaton. In particular, the entry of the instructions relating to the identification of the surfaces to be treated is made easy thanks to the specific interface of the automaton. The automaton can thus serve as an assistant on a construction site, in order to perform in particular the most repetitive tasks requiring no particular technical skill. It may in particular perform a painting treatment, for example by spraying paint onto the surface, or a sanding treatment, for example by rotating an abrasive means against the surface to be treated, or even the application of a coating, for example by spraying.
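To make the safety behavior described above concrete, here is a minimal sketch in Python of a supervision loop that halts platform and arm commands when the presence sensors trip or a drive meets an unexpected resistance. The states, command names and callbacks are assumptions made for the example, not part of the patent.

```python
import enum

class State(enum.Enum):
    RUNNING = "running"
    INTERRUPTED = "interrupted"   # waiting for a person to check the obstruction

def supervise(motion_commands, intrusion_detected, drive_obstructed):
    """Stop the platform/arm motion as soon as the optical barrier reports an
    intrusion or a drive command meets an unexpected resistance."""
    state = State.RUNNING
    executed = []
    for command in motion_commands:
        if intrusion_detected() or drive_obstructed():
            state = State.INTERRUPTED
            break                  # the automaton waits for an operator
        executed.append(command)
    return state, executed

# Illustrative run: the third command is never executed because the presence
# sensors trip after two commands have been carried out.
events = iter([False, False, True, True])
state, done = supervise(["move_arm", "move_platform", "move_arm"],
                        intrusion_detected=lambda: next(events),
                        drive_obstructed=lambda: False)
print(state, done)   # State.INTERRUPTED ['move_arm', 'move_platform']
```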
Claims (11):
1. An automaton (1) for treating a surface to be treated, comprising a processing means (10), for example an arm, comprising a movable end (12) configured to treat a surface, and an interface configured to indicate to the automaton (1) the surface to be treated, characterized in that the interface comprises a screen (21) configured to display a representation of at least a part of the room in which the surface to be treated is located, and in that the interface is configured to allow a person to select the surface to be treated on the representation displayed on the screen (21).
2. The automaton (1) according to claim 1, wherein the interface is also configured to display, for example graphically or textually, information relating to the operation of the automaton (1), for example its operating parameters or anomalies.
3. The automaton (1) according to claim 1 or 2, wherein the interface is configured to identify, on the part of the room displayed by the screen, at least one determined surface surrounded by a closed contour.
4. The automaton (1) according to the preceding claim, wherein the interface is configured to allow a person: to select the surface to be treated, by selection, on the representation displayed on the screen (21), of at least one determined surface surrounded by a closed contour, and optionally to exclude at least one portion of said determined surface from treatment, by selection, on the representation displayed on the screen (21), of a determined portion surrounded by a closed contour located within said determined surface.
5. The automaton (1) according to any one of claims 1 to 3, wherein the interface is configured to allow a person: to select the surface to be treated, by drawing, on the representation displayed on the screen (21), at least one closed contour defining a delimited surface, and optionally to exclude at least one portion of said delimited surface from treatment, by drawing, on the representation displayed on the screen (21), a closed contour defining a delimited portion located inside said delimited surface.
6. The automaton (1) according to claim 5, wherein the interface is configured to identify, on the part of the room displayed by the screen, at least one singularity, for example one or more corners or one or more edges, and wherein the interface is configured to modify the closed contour of at least one delimited surface to match one or more of said singularities.
7. An automaton (1) according to any one of the preceding claims, wherein the screen (21) of the interface is touch-sensitive and/or wherein the interface comprises a manual pointing device, distinct from the screen (21), such as a mouse or a touchpad.
8. An automaton (1) according to any one of the preceding claims, wherein the screen (21) is configured to display the representation of at least a portion of the room in which the surface to be treated is located, from a 3D modeling file of said room.
9. An automaton (1) according to any one of the preceding claims, wherein the screen (21) is configured to display the representation of at least a portion of the room in which the surface to be treated is located, from data obtained in the room, for example received from a means of scanning said room such as a scanner or a 3D camera.
10. An automaton (1) according to any one of the preceding claims, further comprising: - a base (2) configured to move on the ground, and - a platform (6) mounted on the base and configured to move, at least in part, perpendicular to the base, for example vertically, and wherein the processing means (10) is mounted on the platform (6).
11. The automaton (1) according to any one of the preceding claims, further comprising presence sensors and being configured to limit or even avoid contacts with potential obstacles, for example with people.
Patent family:
Publication number | Publication date
US20190118209A1 | 2019-04-25
EP3448636A1 | 2019-03-06
WO2017187105A1 | 2017-11-02
FR3050672B1 | 2018-11-23
Cited documents:
Publication number | Filing date | Publication date | Applicant | Title
JPS6211908A | 1985-07-10 | 1987-01-20 | Hitachi Ltd | Teaching system for robot action
US20140303775A1 | 2011-12-08 | 2014-10-09 | Lg Electronics Inc. | Automatic moving apparatus and manual operation method thereof
US20130343640A1 | 2012-06-21 | 2013-12-26 | Rethink Robotics, Inc. | Vision-guided robots and methods of training them
US20140135986A1 | 2012-11-14 | 2014-05-15 | Fanuc America Corporation | Teaching point program selection method for robot simulator
EP2862678A2 | 2013-10-17 | 2015-04-22 | Kabushiki Kaisha Yaskawa Denki | Teaching system and teaching method
US20150375390A1 | 2014-03-06 | 2015-12-31 | Encore Automation | Robotic system for applying surface finishes to large objects
DE4231766A1 | 1992-09-23 | 1994-03-24 | Licentia Gmbh | Method for entering and displaying the setting parameters of a device for coating objects
US5429682A | 1993-08-19 | 1995-07-04 | Advanced Robotics Technologies | Automated three-dimensional precision coatings application apparatus
EP1422170B1 | 1997-05-30 | 2006-07-26 | Hino Jidosha Kogyo Kabushiki Kaisha | Multi-color small amount painting system
US6484121B1 | 2000-09-07 | 2002-11-19 | Ford Global Technologies, Inc. | System for automatically measuring paint film thickness
FR2902038B1 | 2006-06-07 | 2009-02-27 | Bouygues Construction Sa | Ceiling sanding robot
ITPI20120062A1 | 2012-05-21 | 2013-11-22 | Cmo Di Sodini Dino & C S N C | Method for the painting of objects and equipment carrying out this method
CA2901691A1 | 2013-02-25 | 2017-02-27 | John Grimes | Automated paint application system and related method
GB2531576B | 2014-10-22 | 2018-04-25 | Q Bot Ltd | Modular Robot
US9857888B2 | 2015-03-17 | 2018-01-02 | Behr Process Corporation | Paint your place application for optimizing digital painting of an image
US10339233B2 | 2015-07-27 | 2019-07-02 | Siemens Industry Software Ltd. | Calculating thicknesses of applied coating material
US10526799B2 | 2017-03-31 | 2020-01-07 | Canvas Construction, Inc. | Automated drywall cutting and hanging system and method
EP3687742A4 | 2017-09-25 | 2021-06-16 | Canvas Construction, Inc. | Automated wall finishing system and method
CN108789426A | 2017-12-29 | 2018-11-13 | 金门工程建设有限公司 | The mechanical people of surface treatment
DE102018100944A1 | 2018-01-17 | 2019-07-18 | Eleggs Gmbh Consulting + Components | Arrangement for the vertical movement of an industrial robot and industrial hall equipped therewith
DE102018123416A1 | 2018-09-24 | 2020-03-26 | Dinah Isabel Spitzley | System and device for processing surfaces of buildings
FR3101803B1 | 2019-10-15 | 2021-10-08 | Ambpr | Robot for the renovation by stripping and/or paint coating, and/or the inspection of a wall of large area and/or high height, associated operating process and application to the stripping and painting of ship hulls
CN110842939B | 2019-11-19 | 2021-07-02 | 广东博智林机器人有限公司 | Polishing robot
CN112007789A | 2020-08-04 | 2020-12-01 | 中国石油天然气集团有限公司 | Prefabricated welding seam coating robot
Legal events:
Date | Code | Event | Details
2017-03-16 | PLFP | Fee payment | Year of fee payment: 2
2017-11-03 | PLSC | Search report ready | Effective date: 20171103
2018-03-06 | PLFP | Fee payment | Year of fee payment: 3
2020-01-30 | PLFP | Fee payment | Year of fee payment: 5
2021-03-12 | PLFP | Fee payment | Year of fee payment: 6
Priority and related applications:
Application number | Filing date | Title
FR1653918A | 2016-04-29 | Automaton for treating a surface
EP17725690.6A | 2017-04-28 | Automaton for treating a surface
PCT/FR2017/051018 | 2017-04-28 | Automaton for treating a surface
US16/096,772 | 2017-04-28 | Automaton for treating a surface